
How to be Alone
Jonathan Franzen


Jonathan Franzen’s ‘The Corrections’ was the best-loved and most written-about novel of 2001. Nearly every in-depth review of it discussed what became known as ‘The Harper’s Essay,’ Franzen’s controversial 1996 look at the fate of the novel. This essay is reprinted for the first time in ‘How to be Alone’, alongside the personal essays and painstaking, often funny reportage that earned Franzen a wide readership before the success of ‘The Corrections’. Although his subjects range from the sex-advice industry to the way a supermax prison works, each piece wrestles with familiar themes of Franzen’s writing: the erosion of civic life and private dignity, and the hidden persistence of loneliness, in postmodern, imperial America. Recent pieces include a moving essay on his father’s struggle with Alzheimer’s disease and a rueful account of Franzen’s brief tenure as an Oprah Winfrey author.

As a collection, these essays record what Franzen calls ‘a movement away from an angry and frightened isolation toward an acceptance – even a celebration – of being a reader and a writer.’ At the same time they show the wry distrust of the claims of technology and psychology, the love-hate relationship with consumerism, and the subversive belief in the tragic shape of the individual life that help make Franzen one of the sharpest, toughest-minded, and most entertaining social critics at work today.









JONATHAN FRANZEN

HOW TO BE ALONE

ESSAYS













Copyright


Fourth Estate
An imprint of HarperCollinsPublishers Ltd.
1 London Bridge Street
London SE1 9GF

www.harpercollins.co.uk

First published in Great Britain by Fourth Estate in 2002

Copyright © Jonathan Franzen 2002

Jonathan Franzen asserts the moral right to be identified as the author of this work

A catalogue record for this book is available from the British Library

All rights reserved under International and Pan-American Copyright Conventions. By payment of the required fees, you have been granted the nonexclusive, nontransferable right to access and read the text of this ebook on-screen. No part of this text may be reproduced, transmitted, downloaded, decompiled, reverse engineered, or stored in or introduced into any information storage and retrieval system, in any form or by any means, whether electronic or mechanical, now known or hereinafter invented, without the express written permission of HarperCollins ebooks

HarperCollinsPublishers has made every reasonable effort to ensure that any picture content and written content in this ebook has been included or removed in accordance with the contractual and technological constraints in operation at the time of publication

Source ISBN: 9780007153589

Ebook Edition © OCTOBER 2012 ISBN: 9780007389063

Version: 2017-06-09




Praise


From the reviews of How to Be Alone:

‘Stunning … Each page is studded with irresistible writing which leaves you breathless for more. Franzen’s strength is his ability to combine a rigorous intellectual approach with an upbeat energy, using language which touches the heart as surely as the head’

Time Out

‘How to Be Alone reveals [Franzen] to be an impressively versatile non-fiction writer, equally sure of touch and tone whether reporting on the US prison system, reviewing sex self-help books, or joining the debates about privacy and smoking with a rare acuity and sanity’

Guardian

‘Franzen possesses an incredibly strong narrative voice – his tone is packed with confidence, exuberance and energy … In its ambition, its intellectual vigour and its knowledge, [How to Be Alone] more than justifies its place on any bookshelf … At his best, [Franzen] is nothing less than mesmerising’

Sunday Business Post

‘Full of quips and aphorisms, [Franzen] is the Montaigne of our times, both inventively comic and deadly serious … Beguiling … You love him for his scrupulous honesty and for his arresting originality’

Harpers & Queen

‘Franzen’s almost casual brilliance with language can leave you giddy with the thrill’

Irish Times

‘An engrossing read where the combination of plot with technical detail is exquisite … ideal for anyone who enjoys intelligent writing’

Herald, Glasgow

‘Much thought and craft has gone into these essays … [Franzen’s] writing is an antidote in itself to the gloom it so eloquently describes’

Sunday Telegraph




DEDICATION


FOR KATHY CHETKOVICH


Contents

COVER

TITLE PAGE

COPYRIGHT

PRAISE

DEDICATION

A WORD ABOUT THIS BOOK

MY FATHER’S BRAIN

IMPERIAL BEDROOM

WHY BOTHER?

LOST IN THE MAIL

ERIKA IMPORTS

SIFTING THE ASHES

THE READER IN EXILE

FIRST CITY

SCAVENGING

CONTROL UNITS

MR. DIFFICULT

BOOKS IN BED

MEET ME IN ST. LOUIS

INAUGURATION DAY, JANUARY 2001

KEEP READING

ABOUT THE AUTHOR

ALSO BY JONATHAN FRANZEN

ABOUT THE PUBLISHER




A WORD ABOUT THIS BOOK


MY THIRD NOVEL, The Corrections, which I’d worked on for many years, was published a week before the World Trade Center fell. This was a time when it seemed that the voices of self and commerce ought to fall silent—a time when you wanted, in Nick Carraway’s phrase, “the world to be in uniform and at a sort of moral attention forever.” Nevertheless, business is business. Within forty-eight hours of the calamity, I was giving interviews again.

My interviewers were particularly interested in what they referred to as “the Harper’s essay.” (Nobody used the original title, “Perchance to Dream,” that the magazine’s editors had given it.) Interviews typically began with the question: “In your Harper’s essay in 1996, you promised that your third book would be a big social novel that would engage with mainstream culture and rejuvenate American literature; do you think you’ve kept that promise with The Corrections?” To each succeeding interviewer I explained that, no, to the contrary, I had barely mentioned my third novel in the essay; that the notion of a “promise” had been invented out of thin air by an editor or a headline writer at the Times Sunday Magazine; and that, in fact, far from promising to write a big social novel that would bring news to the mainstream, I’d taken the essay as an opportunity to renounce that variety of ambition. Because most interviewers hadn’t read the essay, and because the few who had read it seemed to have misunderstood it, I became practiced at giving a clear, concise précis of its argument; by the time I did my hundredth or hundred-tenth interview, in November, I’d worked up a nice little corrective spiel that began, “No, actually, the Harper’s essay was about abandoning my sense of social responsibility as a novelist and learning to write fiction for the fun and entertainment of it …” I was puzzled, and more than a little aggrieved, that nobody seemed able to discern this simple, clear idea in the text. How willfully stupid, I thought, these media people were!

In December I decided to pull together an essay collection that would include the complete text of “Perchance to Dream” and make clear what I had and hadn’t said in it. But when I opened the April 1996 Harper’s I found an essay, evidently written by me, that began with a five-thousand-word complaint of such painful stridency and tenuous logic that even I couldn’t quite follow it. In the five years since I’d written the essay, I’d managed to forget that I used to be a very angry and theory-minded person. I used to consider it apocalyptically worrisome that Americans watch a lot of TV and don’t read much Henry James. I used to be the kind of religious nut who convinces himself that, because the world doesn’t share his particular faith (for me, a faith in literature), we must be living in End Times. I used to think that our American political economy was a vast cabal whose specific aim was to thwart my artistic ambitions, exterminate all that I found lovely in civilization, and also rape and murder the planet in the process. The first third of the Harper’s essay was written from this place of anger and despair, in a tone of high theoretical dudgeon that makes me cringe a little now.

It’s true that, even in 1996, I intended the essay to document a stalled novelist’s escape from the prison of his angry thoughts. And so part of me is inclined now to reprint the thing exactly as it first appeared, as a record of my former zealotry. I’m guessing, though, that most readers will have limited appetite for pronouncements such as



It seemed clear to me that if anybody who mattered in business or government believed there was a future in books, we would not have been witnessing such a frenzy in Washington and on Wall Street to raise half a trillion dollars for an Infobahn whose proponents paid lip service to the devastation it would wreak on reading (“You have to get used to reading on a screen”) but could not conceal their indifference to the prospect.

Because a little of this goes a long way, I’ve exercised my authorial license and cut the essay by a quarter and revised it throughout. (I’ve also retitled it “Why Bother?”) Although it’s still very long, my hope is that it’s less taxing to read now, more straightforward in its movement. If nothing else, I want to be able to point to it and say, “See, the argument is really quite clear and simple, just like I said!”

What goes for the Harper’s essay goes for this collection as a whole. I intend this book, in part, as a record of a movement away from an angry and frightened isolation toward an acceptance—even a celebration—of being a reader and a writer. Not that there’s not still plenty to be mad and scared about. Our national thirst for petroleum, which has already produced two Bush presidencies and an ugly Gulf War, is now threatening to lead us into an open-ended long-term conflict in Central Asia. Although you wouldn’t have thought it possible, Americans seem to be asking even fewer questions about their government today than in 1991, and the major media sound even more monolithically jingoistic. While Congress yet again votes against applying easily achievable fuel-efficiency standards to SUVs, the president of Ford Motor Company can be seen patriotically defending these vehicles in a TV ad, avowing that Americans must never accept “boundaries of any kind.”

With so much fresh outrageousness being manufactured daily, I’ve chosen to do only minimal tinkering with the other essays in this book. “First City” reads a little differently without the World Trade Center; “Imperial Bedroom” was written before John Ashcroft came to power with his seeming indifference to personal liberties; anthrax has lent further poignancy to the woes of the United States Postal Service, as described in “Lost in the Mail”; and Oprah Winfrey’s disinvitation of me from her Book Club makes the descriptive word “elitist” fluoresce in the several essays where it appears. But the local particulars of content matter less to me than the underlying investigation in all these essays: the problem of preserving individuality and complexity in a noisy and distracting mass culture: the question of how to be alone.

[2002]




MY FATHER’S BRAIN


HERE’S A MEMORY. On an overcast morning in February 1996, I received in the mail from my mother, in St. Louis, a Valentine’s package containing one pinkly romantic greeting card, two four-ounce Mr. Goodbars, one hollow red filigree heart on a loop of thread, and one copy of a neuropathologist’s report on my father’s brain autopsy.

I remember the bright gray winter light that morning. I remember leaving the candy, the card, and the ornament in my living room, taking the autopsy report into my bedroom, and sitting down to read it. The brain (it began) weighed 1,255 gm and showed parasagittal atrophy with sulcal widening. I remember translating grams into pounds and pounds into the familiar shrink-wrapped equivalents in a supermarket meat case. I remember putting the report back into its envelope without reading any further.

Some years before he died, my father had participated in a study of memory and aging sponsored by Washington University, and one of the perks for participants was a postmortem brain autopsy, free of charge. I suspect that the study offered other perks of monitoring and treatment which had led my mother, who loved freebies of all kinds, to insist that my father volunteer for it. Thrift was also probably her only conscious motive for including the autopsy report in my Valentine’s package. She was saving thirty-two cents’ postage.

My clearest memories of that February morning are visual and spatial: the yellow Mr. Goodbar, my shift from living room to bedroom, the late-morning light of a season as far from the winter solstice as from spring. I’m aware, however, that even these memories aren’t to be trusted. According to the latest theories, which are based on a wealth of neurological and psychological research in the last few decades, the brain is not an album in which memories are stored discretely like unchanging photographs. A memory is, instead, in the phrase of the psychologist Daniel L. Schacter, a “temporary constellation” of activity—a necessarily approximate excitation of neural circuits that bind a set of sensory images and semantic data into the momentary sensation of a remembered whole. These images and data are seldom the exclusive property of one particular memory. Indeed, even as my experience on that Valentine’s morning was unfolding, my brain was relying on pre-existing categories of “red” and “heart” and “Mr. Goodbar”; the gray sky in my windows was familiar from a thousand other winter mornings; and I already had millions of neurons devoted to a picture of my mother—her stinginess with postage, her romantic attachments to her children, her lingering anger toward my father, her weird lack of tact, and so on. What my memory of that morning therefore consists of, according to the latest models, is a set of hardwired neuronal connections among the pertinent regions of the brain, and a predisposition for the entire constellation to light up—chemically, electrically—when any one part of the circuit is stimulated. Speak the words “Mr. Goodbar” and ask me to free-associate, and if I don’t say “Diane Keaton” I will surely say “brain autopsy.”

My Valentine’s memory would work this way even if I were dredging it up now for the first time ever. But the fact is that I’ve re-remembered that February morning countless times since then. I’ve told the story to my brothers. I’ve offered it as an Outrageous Mother Incident to friends of mine who enjoy that kind of thing. I’ve even, shameful to report, told people I hardly know at all. Each succeeding recollection and retelling reinforces the constellation of images and knowledge that constitute the memory. At the cellular level, according to neuroscientists, I’m burning the memory in a little deeper each time, strengthening the dendritic connections among its components, further encouraging the firing of that specific set of synapses. One of the great adaptive virtues of our brains, the feature that makes our gray matter so much smarter than any machine yet devised (my laptop’s cluttered hard drive or a World Wide Web that insists on recalling, in pellucid detail, a Beverly Hills 90210 fan site last updated on 11/20/98), is our ability to forget almost everything that has ever happened to us. I retain general, largely categorical memories of the past (a year spent in Spain; various visits to Indian restaurants on East Sixth Street) but relatively few specific episodic memories. Those memories that I do retain I tend to revisit and, thereby, strengthen. They become literally—morphologically, electrochemically—part of the architecture of my brain.

This model of memory, which I’ve presented here in a rather loose layperson’s summary, excites the amateur scientist in me. It feels true to the twinned fuzziness and richness of my own memories, and it inspires awe with its image of neural networks effortlessly self-coordinating, in a massively parallel way, to create my ghostly consciousness and my remarkably sturdy sense of self. It seems to me lovely and postmodern. The human brain is a web of a hundred billion neurons, maybe as many as two hundred billion, with trillions of axons and dendrites exchanging quadrillions of messages by way of at least fifty different chemical transmitters. The organ with which we observe and make sense of the universe is, by a comfortable margin, the most complex object we know of in that universe.

And yet it’s also a lump of meat. At some point, maybe later on that same Valentine’s Day, I forced myself to read the entire pathology report. It included a “Microscopic Description” of my father’s brain:



Sections of the frontal, parietal, occipital, and temporal cerebral cortices showed numerous senile plaques, prominently diffuse type, with minimal numbers of neurofibrillary tangles. Cortical Lewy bodies were easily detected in H&E stained material. The amygdala demonstrated plaques, occasional tangles and mild neuron loss.

In the notice that we had run in local newspapers nine months earlier, my mother insisted that we say my father had died “after long illness.” She liked the phrase’s formality and reticence, but it was hard not to hear her grievance in it as well, her emphasis on long. The pathologist’s identification of senile plaques in my father’s brain served to confirm, as only an autopsy could, the fact with which she’d struggled daily for many years: like millions of other Americans, my father had had Alzheimer’s disease.

This was his disease. It was also, you could argue, his story. But you have to let me tell it.

ALZHEIMER’S IS A DISEASE of classically “insidious onset.” Since even healthy people become more forgetful as they age, there’s no way to pinpoint the first memory to fall victim to it. The problem was especially vexed in the case of my father, who not only was depressive and reserved and slightly deaf but also was taking strong medicines for other ailments. For a long time it was possible to chalk up his non sequiturs to his hearing impairment, his forgetfulness to his depression, his hallucinations to his medicines; and chalk them up we did.

My memories of the years of my father’s initial decline are vividly about things other than him. Indeed, I’m somewhat appalled by how large I loom in my own memories, how peripheral my parents are. But I was living far from home in those years. My information came mainly from my mother’s complaints about my father, and these complaints I took with a grain of salt; she’d been complaining to me pretty much all my life.

My parents’ marriage was, it’s safe to say, less than happy. They stayed together for the sake of their children and for want of hope that divorce would make them any happier. As long as my father was working, they enjoyed autonomy in their respective fiefdoms of home and workplace, but after he retired, in 1981, at the age of sixty-six, they commenced a round-the-clock performance of No Exit in their comfortably furnished suburban house. I arrived for brief visits like a U.N. peacekeeping force to which each side passionately presented its case against the other.

Unlike my mother, who was hospitalized nearly thirty times in her life, my father had perfect health until he retired. His parents and uncles had lived into their eighties and nineties, and he, Earl Franzen, fully expected to be around at ninety “to see,” as he liked to say, “how things turn out.” (His anagrammatic namesake Lear imagined his last years in similar terms: listening to “court news,” with Cordelia, to see “who loses and who wins, who’s in, who’s out.”) My father had no hobbies and few pleasures besides eating meals, seeing his children, and playing bridge, but he did take a narrative interest in life. He watched a staggering amount of TV news. His ambition for old age was to follow the unfolding histories of the nation and his children for as long as he could.

The passivity of this ambition, the sameness of his days, tended to make him invisible to me. From the early years of his mental decline I can dredge up exactly one direct memory: watching him, toward the end of the eighties, struggle and fail to calculate the tip on a restaurant bill.

Fortunately, my mother was a great writer of letters. My father’s passivity, which I regarded as regrettable but not really any of my business, was a source of bitter disappointment to her. As late as the fall of 1989—a season in which, according to her letters, my father was still playing golf and undertaking major home repairs—the terms of her complaints remained strictly personal:



It is extremely difficult living with a very unhappy person when you know you must be the major cause of the unhappiness. Decades ago when Dad told me he didn’t believe there is such a thing as love (that sex is a “trap”) and that he was not cut out to be a “happy” person I should have been smart enough to realize there was no hope for a relationship satisfactory to me. But I was busy & involved with my children and friends I loved and I guess, like Scarlett O’Hara, I told myself I would “worry about that tomorrow.”

This letter dates from a period during which the theater of my parents’ war had shifted to the issue of my father’s hearing impairment. My mother maintained that it was inconsiderate not to wear a hearing aid; my father complained that other people lacked the consideration to “speak up.” The battle culminated Pyrrhically in his purchase of a hearing aid that he then declined to wear. Here again, my mother constructed a moral story of his “stubbornness” and “vanity” and “defeatism”; but it’s hard not to suspect, in hindsight, that his faulty ears were already serving to camouflage more serious trouble.

A letter from January 1990 contains my mother’s first written reference to this trouble:



Last week one day he had to skip his breakfast time medication in order to take some motor skills tests at Wash U. where he is in the Memory & Ageing study. That night I awakened to the sound of his electric razor, looked at the clock & he was in the bathroom shaving at 2:30 AM.

Within a few months my father was making so many mistakes that my mother was forced to entertain other explanations:



Either he’s stressed or not concentrating or having some mental deterioration but there have been quite a few incidents recently that really worry me. He keeps leaving the car door open or the lights on & twice in one week we had to call triple A & have them come out & charge the battery (now I’ve posted signs in the garage & that seems to have helped) … I really don’t like the idea of leaving him in the house alone for more than a short while.

My mother’s fear of leaving him alone assumed greater urgency as the year wore on. Her right knee was worn out, and, because she already had a steel plate in her leg from an earlier fracture, she was facing complicated surgery followed by prolonged recovery and rehab. Her letters from late 1990 and early 1991 are marked by paragraphs of agonizing over whether to have surgery and how to manage my father if she did.



Were he in the house alone more than overnight with me in the hospital I would be an absolute basket case as he leaves the water running, the stove on at times, lights on everywhere, etc.… I check & recheck as much as I can on most things lately but even so many of our affairs are in a state of confusion & what really is hardest is his resentment of my intrusion—“stay out of my affairs!!!” He does not accept or realize my wanting to be helpful & that is the hardest thing of all for me.

At the time, I’d recently finished my second novel, and so I offered to stay with my father while my mother had her operation. To steer clear of his pride, she and I agreed to pretend that I was coming for her sake, not his. What’s odd, though, is that I was only half-pretending. My mother’s characterization of my father’s incapacity was compelling, but so was my father’s portrayal of my mother as an alarmist nag. I went to St. Louis because, for her, his incapacity was absolutely real; once there, I behaved as if, for me, it absolutely wasn’t.

Just as she’d feared, my mother was in the hospital for nearly five weeks. Strangely, although I’d never lived alone with my father for so long and never would again, I can now remember almost nothing specific about my stay with him; I have a general impression that he was somewhat quiet, maybe, but otherwise completely normal. Here, you might think, was a direct contradiction of my mother’s earlier reports. And yet I have no memory of being bothered by the contradiction. What I do have is a copy of a letter that I wrote to a friend while in St. Louis. In the letter, I mention that my father has had his medication adjusted and now everything is fine.

Wishful thinking? Yes, to some extent. But one of the basic features of the mind is its keenness to construct wholes out of fragmentary parts. We all have a literal blind spot in our vision where the optic nerve attaches to the retina, but our brain unfailingly registers a seamless world around us. We catch part of a word and hear the whole. We see expressive faces in floral-pattern upholstery; we constantly fill in blanks. In a similar way, I think I was inclined to interpolate across my father’s silences and mental absences and to persist in seeing him as the same old wholly whole Earl Franzen. I still needed him to be an actor in my story of myself. In my letter to my friend, I describe a morning rehearsal of the St. Louis Symphony that my mother insisted that my father and I attend so as not to waste her free tickets to it. After the first half of the session, in which the very young Midori nailed the Sibelius violin concerto, my father sprang from his seat with miserable geriatric agitation. “So,” he said, “we’ll go now.” I knew better than to ask him to sit through the Charles Ives symphony that was coming, but I hated him for what I took to be his philistinism. On the drive home, he had one comment about Midori and Sibelius. “I don’t understand that music,” he said. “What do they do—memorize it?”

LATER THAT SPRING, my father was diagnosed with a small, slow-growing cancer in his prostate. His doctors recommended that he not bother treating it, but he insisted on a course of radiation. With a kind of referred wisdom about his own mental state, he became terrified that something was dreadfully wrong with him: that he would not, after all, survive into his nineties. My mother, whose knee continued to bleed internally six months after her operation, had little patience with what she considered his hypochondria. In September 1991 she wrote:



I’m relieved to have Dad started on his radiation therapy & it forces him to get out of the house every day [inserted, here, a smiley face]—a big plus. He got to the point where he was so nervous, so worried, so depressed I knew he had to make some decision. Actually, being so sedentary now (content to do nothing), he has had too much time to worry & think about himself—he NEEDS distractions! … More & more I feel the greatest attributes anyone can have are (1), a positive attitude & (2), a sense of humor—wish Dad had them.

There ensued some months of relative optimism. The cancer was eradicated, my mother’s knee finally improved, and her native hopefulness returned to her letters. She reported that my father had taken first place in a game of bridge: “With his confusion cleared up & his less conservative approach to the game he is doing remarkably well & it’s about the only thing he enjoys (& can stay awake for!).” But my father’s anxiety about his health did not abate; he had stomach pains that he was convinced were caused by cancer. Gradually, the import of the story my mother was telling me migrated from the personal and the moral toward the psychiatric. “The past six months we have lost so many friends it is very unsettling—part of Dad’s nervousness & depression I’m sure,” she wrote in February 1992. The letter continued:



Dad’s internist, Dr. Rouse, has about concluded what I have felt all along regarding Dad’s stomach discomfort (he’s ruled out all clinical possibilities). Dad is (1) terribly nervous, (2) terribly depressed & I hope Dr. Rouse will put him on an anti-depressant. I know there has to be help for this … There have been disturbing, distressing things in our lives the past year, I know that very well, but Dad’s mental condition is hurting him physically & if he won’t go for counseling (suggested by Dr. Weiss) perhaps he now will accept pills or whatever it takes for nervousness & depression.

For a while, the phrase “nervousness & depression” was a fixture of her letters. Prozac briefly seemed to lift my father’s spirits, but the effects were short-lived. Finally, in July 1992, to my surprise, he agreed to see a psychiatrist.

My father had always been supremely suspicious of psychiatry. He viewed therapy as an invasion of privacy, mental health as a matter of self-discipline, and my mother’s increasingly pointed suggestions that he “talk to someone” as acts of aggression—little lobbed grenades of blame for their unhappiness as a couple. It was a measure of his desperation that he voluntarily set foot in a psychiatrist’s office.

In October, when I stopped in St. Louis on my way to Italy, I asked him about his sessions with the doctor. He made a hopeless gesture with his hands. “He’s extremely able,” he said. “But I’m afraid he’s written me off.”

The idea of anybody writing my father off was more than I could stand. From Italy I sent the psychiatrist a three-page appeal for reconsideration, but even as I was writing it the roof was caving in at home. “Much as I dislike telling you,” my mother wrote in a letter faxed to Italy, “Dad has regressed terribly. Medicine for the urinary problem a urologist is treating in combination with medication for depression and nervousness blew his mind again and the hallucinating, etc. was terrible.” There had been a weekend with my Uncle Erv in Indiana, where my father, removed from his familiar surroundings, unleashed a night of madness that culminated in my uncle’s shouting into his face, “Earl, my God, it’s your brother, Erv, we slept in the same bed!” Back in St. Louis, my father had begun to rage against the retired lady, Mrs. Pryble, whom my mother had engaged to sit with him two mornings a week while she ran errands. He didn’t see why he needed sitting, and, even assuming that he did need sitting, he didn’t see why a stranger, rather than his wife, should be doing it. He’d become a classic “sundowner,” dozing through the day and rampaging in the wee hours.

There followed a dismal holiday visit during which my wife and I finally intervened on my mother’s behalf and put her in touch with a geriatric social worker, and my mother urged my wife and me to tire my father out so that he would sleep through the night without psychotic incident, and my father sat stone-faced by the fireplace or told grim stories of his childhood while my mother fretted about the expense, the prohibitive expense, of sessions with a social worker. But even then, as far as I can remember, nobody ever said “dementia.” In all my mother’s letters to me, the word “Alzheimer’s” appears exactly once, in reference to an old German woman I worked for as a teenager.

I REMEMBER my suspicion and annoyance, fifteen years ago, when the term “Alzheimer’s disease” was first achieving currency. It seemed to me another instance of the medicalization of human experience, the latest entry in the ever-expanding nomenclature of victimhood. To my mother’s news about my old employer I replied: “What you describe sounds like the same old Erika, only quite a bit worse, and that’s not how Alzheimer’s is supposed to work, is it? I spend a few minutes every month fretting about ordinary mental illness being trendily misdiagnosed as Alzheimer’s.”

From my current vantage, where I spend a few minutes every month fretting about what a self-righteous thirty-year-old I was, I can see my reluctance to apply the term “Alzheimer’s” to my father as a way of protecting the specificity of Earl Franzen from the generality of a nameable condition. Conditions have symptoms; symptoms point to the organic basis of everything we are. They point to the brain as meat. And, where I ought to recognize that, yes, the brain is meat, I seem instead to maintain a blind spot across which I then interpolate stories that emphasize the more soul-like aspects of the self. Seeing my afflicted father as a set of organic symptoms would invite me to understand the healthy Earl Franzen (and the healthy me) in symptomatic terms as well—to reduce our beloved personalities to finite sets of neurochemical coordinates. Who wants a story of life like that?

Even now, I feel uneasy when I gather facts about Alzheimer’s. Reading, for example, David Shenk’s book The Forgetting: Alzheimer’s: Portrait of an Epidemic, I’m reminded that when my father got lost in his own neighborhood, or forgot to flush the toilet, he was exhibiting symptoms identical to those of millions of other afflicted people. There can be comfort in having company like this, but I’m sorry to see the personal significance drained from certain mistakes of my father’s, like his confusion of my mother with her mother, which struck me at the time as singular and orphic, and from which I gleaned all manner of important new insights into my parents’ marriage. My sense of private selfhood turns out to have been illusory.

Senile dementia has been around for as long as people have had the means of recording it. While the average human life span remained short and old age was a comparative rarity, senility was considered a natural by-product of aging—perhaps the result of sclerotic cerebral arteries. The young German neuropathologist Alois Alzheimer believed he was witnessing an entirely new variety of mental illness when, in 1901, he admitted to his clinic a fifty-one-year-old woman, Auguste D., who was suffering from bizarre mood swings and severe memory loss and who, in Alzheimer’s initial examination of her, gave problematic answers to his questions:



“What is your name?”

“Auguste.”

“Last name?”

“Auguste.”

“What is your husband’s name?”

“Auguste, I think.”



When Auguste D. died in an institution, four years later, Alzheimer availed himself of recent advances in microscopy and tissue-staining and was able to discern, in slides of her brain tissue, the striking dual pathology of her disease: countless sticky-looking globs of “plaque” and countless neurons engulfed by “tangles” of neuronal fibrils. Alzheimer’s findings greatly interested his patron Emil Kraepelin, then the dean of German psychiatry, who was engaged in a fierce scientific battle with Sigmund Freud and Freud’s psycholiterary theories of mental illness. To Kraepelin, Alzheimer’s plaques and tangles provided welcome clinical support for his contention that mental illness was fundamentally organic. In his Handbook of Psychiatry he dubbed Auguste D.’s condition Morbus Alzheimer.

For six decades after Alois Alzheimer’s autopsy of Auguste D., even as breakthroughs in disease prevention and treatment were adding fifteen years to life expectancy in developed nations, Alzheimer’s continued to be viewed as a medical rarity à la Huntington’s disease. David Shenk tells the story of an American neuropathologist named Meta Naumann who, in the early fifties, autopsied the brains of 210 victims of senile dementia and found sclerotic arteries in few of them, plaques and tangles in the majority. Here was ironclad evidence that Alzheimer’s was far more common than anyone had guessed; but Naumann’s work appears to have persuaded no one. “They felt that Meta was talking nonsense,” her husband recalled.

The scientific community simply wasn’t ready to consider that senile dementia might be more than a natural consequence of aging. In the early fifties there was no self-conscious category of “seniors,” no explosion of Sun Belt retirement communities, no AARP, no Early Bird tradition at low-end restaurants; and scientific thinking reflected these social realities. Not until the seventies did conditions become ripe for a reinterpretation of senile dementia. By then, as Shenk says, “so many people were living so long that senility didn’t feel so normal or acceptable anymore.” Congress passed the Research on Aging Act in 1974, and established the National Institute on Aging, for which funding soon mushroomed. By the end of the eighties, at the crest of my annoyance with the clinical term and its sudden ubiquity, Alzheimer’s had achieved the same social and medical standing as heart disease or cancer—and had the research funding levels to show for it.

What happened with Alzheimer’s in the seventies and eighties wasn’t simply a diagnostic paradigm shift. The number of new cases really is soaring. As fewer and fewer people drop dead of heart attacks or die of infections, more and more survive to become demented. Alzheimer’s patients in nursing homes live much longer than other patients, at a cost of at least forty thousand dollars annually per patient; until they’re institutionalized, they increasingly derange the lives of family members charged with caring for them. Already, five million Americans have the disease, and the number could rise to fifteen million by 2050.

Because there’s so much money in chronic illness, drug companies are investing feverishly in proprietary Alzheimer’s research while publicly funded scientists file for patents on the side. But because the science of the disease remains cloudy (a functioning brain is not a lot more accessible than the center of the earth or the edge of the universe), nobody can be sure which avenues of research will lead to effective treatments. Overall, the feeling in the field seems to be that if you’re under fifty you can reasonably expect to be offered effective drugs for Alzheimer’s by the time you need them. Then again, twenty years ago, many cancer researchers were predicting a cure within twenty years.

David Shenk, who is comfortably under fifty, makes the case in The Forgetting that a cure for senile dementia might not be an entirely unmitigated blessing. He notes, for example, that one striking peculiarity of the disease is that its “sufferers” often suffer less and less as it progresses. Caring for an Alzheimer’s patient is gruelingly repetitious precisely because the patient himself has lost the cerebral equipment to experience anything as a repetition. Shenk quotes patients who speak of “something delicious in oblivion” and who report an enhancement of their sensory pleasures as they come to dwell in an eternal, pastless Now. If your short-term memory is shot, you don’t remember, when you stoop to smell a rose, that you’ve been stooping to smell the same rose all morning.

As the psychiatrist Barry Reisberg first observed twenty years ago, the decline of an Alzheimer’s patient mirrors in reverse the neurological development of a child. The earliest capacities a child develops—raising the head (at one to three months), smiling (two to four months), sitting up unassisted (six to ten months)—are the last capacities an Alzheimer’s patient loses. Brain development in a growing child is consolidated through a process called myelinization, wherein the axonal connections among neurons are gradually strengthened by sheathings of the fatty substance myelin. Apparently, since the last regions of the child’s brain to mature remain the least myelinated, they’re the regions most vulnerable to the insult of Alzheimer’s. The hippocampus, which processes short-term memories into long-term, is very slow to myelinize. This is why we’re unable to form permanent episodic memories before the age of three or four, and why the hippocampus is where the plaques and tangles of Alzheimer’s first appear. Hence the ghostly apparition of the middle-stage patient who continues to be able to walk and feed herself even as she remembers nothing from hour to hour. The inner child isn’t inner anymore. Neurologically speaking, we’re looking at a one-year-old.

Although Shenk tries valiantly to see a boon in the Alzheimer’s patient’s childish relief from responsibility and childlike focus on the Now, I’m mindful that becoming a baby again was the last thing my father wanted. The stories he told from his childhood in northern Minnesota were mainly (as befits a depressive’s recollections) horrible: brutal father, unfair mother, endless chores, backwoods poverty, family betrayals, hideous accidents. He told me more than once, after his retirement, that his greatest pleasure in life had been going to work as an adult in the company of other men who valued his abilities. My father was an intensely private person, and privacy for him had the connotation of keeping the shameful content of one’s interior life out of public sight. Could there have been a worse disease for him than Alzheimer’s? In its early stages, it worked to dissolve the personal connections that had saved him from the worst of his depressive isolation. In its later stages it robbed him of the sheathing of adulthood, the means to hide the child inside him. I wish he’d had a heart attack instead.

Still, shaky though Shenk’s arguments for the brighter side of Alzheimer’s may be, his core contention is harder to dismiss: senility is not merely an erasure of meaning but a source of meaning. For my mother, the losses of Alzheimer’s both amplified and reversed long-standing patterns in her marriage. My father had always refused to open himself to her, and now, increasingly, he couldn’t open himself. To my mother, he remained the same Earl Franzen napping in the den and failing to hear. She, paradoxically, was the one who slowly and surely lost her self, living with a man who mistook her for her mother, forgot every fact he’d ever known about her, and finally ceased to speak her name. He, who had always insisted on being the boss in the marriage, the maker of decisions, the adult protector of the childlike wife, now couldn’t help behaving like the child. Now the unseemly outbursts were his, not my mother’s. Now she ferried him around town the way she’d once ferried me and my brothers. Task by task, she took charge of their life. And so, although my father’s “long illness” was a crushing strain and disappointment to her, it was also an opportunity to grow slowly into an autonomy she’d never been allowed: to settle some very old scores.

As for me, once I accepted the scope of the disaster, the sheer duration of Alzheimer’s forced me into unexpectedly welcome closer contact with my mother. I learned, as I might not have otherwise, that I could seriously rely on my brothers and that they could rely on me. And, strangely, although I’d always prized my intelligence and sanity and self-consciousness, I found that watching my father lose all three made me less afraid of losing them myself. I became a little less afraid in general. A bad door opened, and I found I was able to walk through it.

THE DOOR in question was on the fourth floor of Barnes Hospital, in St. Louis. About six weeks after my wife and I had put my mother in touch with the social worker and gone back east, my oldest brother and my father’s doctors persuaded him to enter the hospital for testing. The idea was to get all the medications out of his bloodstream and see what we were dealing with underneath. My mother helped him check in and spent the afternoon settling him into his room. He was still his usual, semipresent self when she left for dinner, but that evening, at home, she began to get calls from the hospital, first from my father, who demanded that she come and remove him from “this hotel,” and then from nurses who reported that he’d become belligerent. When she returned to the hospital in the morning, she found him altogether gone—raving mad, profoundly disoriented.

I flew back to St. Louis a week later. My mother took me straight from the airport to the hospital. While she spoke to the nurses, I went to my father’s room and found him in bed, wide awake. I said hello. He made frantic shushing gestures and beckoned me to his pillow. I leaned over him and he asked me, in a husky whisper, to keep my voice down because “they” were “listening.” I asked him who “they” were. He couldn’t tell me, but his eyes rolled fearfully to scan the room, as if he’d lately seen “them” everywhere and were puzzled by “their” disappearance. When my mother appeared in the doorway, he confided to me, in an even lower whisper, “I think they’ve gotten to your mother.”

My memories of the week that followed are mainly a blur, punctuated by a couple of life-changing scenes. I went to the hospital every day and sat with my father for as many hours as I could stand. At no point did he string together two coherent sentences. The memory that appears to me most significant in hindsight is a very peculiar one. It’s lit by a dreamlike indoor twilight, it’s set in a hospital room whose orientation and cramped layout are unfamiliar from any of my other memories, and it returns to me now without any of the chronological markers that usually characterize my memories. I’m not sure it even dates from that first week I saw my father in the hospital. And yet I’m sure that I’m not remembering a dream. All memories, the neuroscientists say, are actually memories of memory, but usually they don’t feel that way. Here’s one that does. I remember remembering: my father in bed, my mother sitting beside it, me standing near the door. We’ve been having an anguished family conversation, possibly about where to move my father after his discharge from the hospital. It’s a conversation that my father, to the slight extent that he can follow it, is hating. Finally he cries out with passionate emphasis, as if he’s had enough of all the nonsense, “I have always loved your mother. Always.” And my mother buries her face in her hands and sobs.

This was the only time I ever heard my father say he loved her. I’m certain the memory is legitimate because the scene seemed to me immensely significant even at the time, and I then described it to my wife and brothers and incorporated it into the story I was telling myself about my parents. In later years, when my mother insisted that my father had never said he loved her, not even once, I asked if she remembered that time in the hospital. I repeated what he’d said, and she shook her head uncertainly. “Maybe,” she said. “Maybe he did. I don’t remember that.”

My brothers and I took turns going to St. Louis every few months. My father never failed to recognize me as someone he was happy to see. His life in a nursing home appeared to be an endless troubled dream populated by figments from his past and by his deformed and brain-damaged fellow inmates; his nurses were less like actors in the dream than like unwelcome intruders on it. Unlike many of the female inmates, who at one moment were wailing like babies and at the next moment glowing with pleasure while someone fed them ice cream, I never saw my father cry, and the pleasure he took in ice cream never ceased to look like an adult’s. He gave me significant nods and wistful smiles as he confided to me fragments of nonsense to which I nodded as if I understood. His most consistently near-coherent theme was his wish to be removed from “this hotel” and his inability to understand why he couldn’t live in a little apartment and let my mother take care of him.

For Thanksgiving that year, my mother and my wife and I checked him out of the nursing home and brought him home with a wheelchair in my Volvo station wagon. He hadn’t been in the house since he’d last been living there, ten months earlier. If my mother had been hoping for a gratifying show of pleasure from him, she was disappointed; by then, a change of venue no more impressed my father than it does a one-year-old. We sat by the fireplace and, out of unthinking, wretched habit, took pictures of a man who, if he knew nothing else, seemed full of unhappy knowledge of how dismal a subject for photographs he was. The images are awful to me now: my father listing in his wheelchair like an unstrung marionette, eyes mad and staring, mouth sagging, glasses smeared with strobe light and nearly falling off his nose; my mother’s face a mask of reasonably well-contained despair; and my wife and I flashing grotesquely strained smiles as we reach to touch my father. At the dinner table my mother spread a bath towel over my father and cut his turkey into little bites. She kept asking him if he was happy to be having Thanksgiving dinner at home. He responded with silence, shifting eyes, sometimes a faint shrug. My brothers called to wish him a happy holiday; and here, out of the blue, he mustered a smile and a hearty voice, he was able to answer simple questions, he thanked them both for calling.

This much of the evening was typically Alzheimer’s. Because children learn social skills very early, a capacity for gestures of courtesy and phrases of vague graciousness survives in many Alzheimer’s patients long after their memories are shot. It wasn’t so remarkable that my father was able to handle (sort of) my brothers’ holiday calls. But consider what happened next, after dinner, outside the nursing home. While my wife ran inside for a geri chair, my father sat beside me and studied the institutional portal that he was about to reenter. “Better not to leave,” he told me in a clear, strong voice, “than to have to come back.” This was not a vague phrase; it pertained directly to the situation at hand, and it strongly suggested an awareness of his larger plight and his connection to the past and future. He was requesting that he be spared the pain of being dragged back toward consciousness and memory. And, sure enough, on the morning after Thanksgiving, and for the remainder of our visit, he was as crazy as I ever saw him, his words a hash of random syllables, his body a big flail of agitation.

For David Shenk, the most important of the “windows onto meaning” afforded by Alzheimer’s is its slowing down of death. Shenk likens the disease to a prism that refracts death into a spectrum of its otherwise tightly conjoined parts—death of autonomy, death of memory, death of self-consciousness, death of personality, death of body—and he subscribes to the most common trope of Alzheimer’s: that its particular sadness and horror stem from the sufferer’s loss of his or her “self” long before the body dies.

This seems mostly right to me. By the time my father’s heart stopped, I’d been mourning him for years. And yet, when I consider his story, I wonder whether the various deaths can ever really be so separated, and whether memory and consciousness have such secure title, after all, to the seat of selfhood. I can’t stop looking for meaning in the two years that followed his loss of his supposed “self,” and I can’t stop finding it.

I’m struck, above all, by the apparent persistence of his will. I’m powerless not to believe that he was exerting some bodily remnant of his self-discipline, some reserve of strength in the sinews beneath both consciousness and memory, when he pulled himself together for the request he made to me outside the nursing home. I’m powerless as well not to believe that his crash on the following morning, like his crash on his first night alone in a hospital, amounted to a relinquishment of that will, a letting-go, an embrace of madness in the face of unbearable emotion. Although we can fix the starting point of his decline (full consciousness and sanity) and the end point (oblivion and death), his brain wasn’t simply a computational device running gradually and inexorably amok. Where the subtractive progress of Alzheimer’s might predict a steady downward trend like this—

[figure: a smooth, steadily falling line; not reproduced in this edition]

what I saw of my father’s fall looked more like this:

[figure: a more erratic descent, with plateaus and abrupt drops; not reproduced in this edition]
He held himself together longer, I suspect, than it might have seemed he had the neuronal wherewithal to do. Then he collapsed and fell lower than his pathology may have strictly dictated, and he chose to stay low, ninety-nine percent of the time. What he wanted (in the early years, to stay clear; in the later years, to let go) was integral to what he was. And what I want (stories of my father’s brain that are not about meat) is integral to what I choose to remember and retell.

One of the stories I’ve come to tell, then, as I try to forgive myself for my long blindness to his condition, is that he was bent on concealing that condition and, for a remarkably long time, retained the strength of character to bring it off. My mother used to swear that this was so. He couldn’t fool the woman he lived with, no matter how he bullied her, but he could pull himself together as long as he had sons in town or guests in the house. The true solution of the conundrum of my stay with him during my mother’s operation probably has less to do with my blindness than with the additional will he was exerting.

After the bad Thanksgiving, when we knew he was never coming home again, I helped my mother sort through his desk. (It’s the kind of liberty you take with the desk of a child or a dead person.) In one of his drawers we found evidence of small, covert endeavors not to forget. There was a sheaf of papers on which he’d written the addresses of his children, one address per slip, the same address on several. On another slip he’d written the birth dates of his older sons—“Bob 1–13–48” and “TOM 10–15–50”—and then, in trying to recall mine (August 17, 1959), he had erased the month and day and made a guess on the basis of my brothers’ dates: “JON 10–13–49.”

Consider, too, what I believe are the last words he ever spoke to me, three months before he died. For a couple of days, I’d been visiting the nursing home for a dutiful ninety minutes and listening to his mutterings about my mother and to his affable speculations about certain tiny objects that he persisted in seeing on the sleeves of his sweater and the knees of his pants. He was no different when I dropped by on my last morning, no different when I wheeled him back to his room and told him I was heading out of town. But then he raised his face toward mine and—again, out of nowhere, his voice was clear and strong—he said: “Thank you for coming. I appreciate your taking the time to see me.”

Set phrases of courtesy? A window on his fundamental self? I seem to have little choice about which version to believe.

IN RELYING ON MY MOTHER’S LETTERS to reconstruct my father’s disintegration, I feel the shadow of the undocumented years after 1992, when she and I talked on the phone at greater length and ceased to write all but the briefest notes. Plato’s description of writing, in the Phaedrus, as a “crutch of memory” seems to me fully accurate: I couldn’t tell a clear story of my father without those letters. But, where Plato laments the decline of the oral tradition and the atrophy of memory which writing induces, I at the other end of the Age of the Written Word am impressed by the sturdiness and reliability of words on paper. My mother’s letters are truer and more complete than my self-absorbed and biased memories; she’s more alive to me in the written phrase “he NEEDS distractions!” than in hours of videotape or stacks of pictures of her.

The will to record indelibly, to set down stories in permanent words, seems to me akin to the conviction that we are larger than our biologies. I wonder if our current cultural susceptibility to the charms of materialism—our increasing willingness to see psychology as chemical, identity as genetic, and behavior as the product of bygone exigencies of human evolution—isn’t intimately related to the postmodern resurgence of the oral and the eclipse of the written: our incessant telephoning, our ephemeral e-mailing, our steadfast devotion to the flickering tube.

Have I mentioned that my father, too, wrote letters? Usually typewritten, usually prefaced with an apology for misspellings, they came much less frequently than my mother’s. One of the last is from December 1987:



This time of the year is always difficult for me. I’m ill at ease with all the gift-giving, as I would love to get things for people but lack the imagination to get the right things. I dread the shopping for things that are the wrong size or the wrong color or something not needed, and anticipate the problems of returning or exchanging. I like to buy tools, but Bob pointed out a problem with this category, when for some occasion I gave him a nice little hammer with good balance, and his comment was that this was the second or third hammer and I don’t need any more, thank you. And then there is the problem of gifts for your mother. She is so sentimental that it hurts me not to get her something nice, but she has access to my checking account with no restrictions. I have told her to buy something for herself, and say it is from me, so she can compete with the after-Christmas comment: “See what I got from my husband!” But she won’t participate in that fraud. So I suffer through the season.

In 1989, as his powers of concentration waned with his growing “nervousness & depression,” my father stopped writing letters altogether. My mother and I were therefore amazed to find, in the same drawer in which he’d left those addresses and birth dates, an unsent letter dated January 22, 1993—unimaginably late, a matter of weeks before his final breakdown. The letter was in an envelope addressed to my nephew Nick, who, at age six, had just begun to write letters himself. Possibly my father was ashamed to send a letter that he knew wasn’t fully coherent; more likely, given the state of his hippocampal health, he simply forgot. The letter, which for me has become an emblem of invisibly heroic exertions of the will, is written in a tiny penciled script that keeps veering away from the horizontal:



Dear Nick,

We got your letter a couple days ago and were pleased to see how well you were doing in school, particularly in math. It is important to write well, as the ability to exchange ideas will govern the use that one country can make of another country’s ideas.

Most of your nearest relatives are good writers, and thereby took the load off me. I should have learned better how to write, but it is so easy to say, Let Mom do it.

I know that my writing will not be easy to read, but I have a problem with the nerves in my legs and tremors in my hands. In looking at what I have written, I expect you will have difficulty to understand, but with a little luck, I may keep up with you.

We have had a change in the weather from cold and wet to dry with fair blue skies. I hope it stays this way. Keep up the good work.

Love, Grandpa

P.S. Thank you for the gifts.



MY FATHER’S HEART and lungs were very strong, and my mother was bracing herself for two or three more years of endgame when, one day in April 1995, he stopped eating. Maybe he was having trouble swallowing, or maybe, with his remaining shreds of will, he’d resolved to put an end to his unwanted second childhood.

His blood pressure was seventy over palpable when I flew into town. Again, my mother took me straight to the nursing home from the airport. I found him curled up on his side under a thin sheet, breathing shallowly, his eyes shut loosely. His muscle had wasted away, but his face was smooth and calm and almost entirely free of wrinkles, and his hands, which had changed not at all, seemed curiously large in comparison to the rest of him. There’s no way to know if he recognized my voice, but within minutes of my arrival his blood pressure climbed to 120/90. I worried then, worry even now, that I made things harder for him by arriving: that he’d reached the point of being ready to die but was ashamed to perform such a private or disappointing act in front of one of his sons.

My mother and I settled into a rhythm of watching and waiting, one of us sleeping while the other sat in vigil. Hour after hour, my father lay unmoving and worked his way toward death; but when he yawned, the yawn was his. And his body, wasted though it was, was likewise still radiantly his. Even as the surviving parts of his self grew ever smaller and more fragmented, I persisted in seeing a whole. I still loved, specifically and individually, the man who was yawning in that bed. And how could I not fashion stories out of that love—stories of a man whose will remained intact enough to avert his face when I tried to clear his mouth out with a moist foam swab? I’ll go to my own grave insisting that my father was determined to die and to die, as best he could, on his own terms.

We, for our part, were determined that he not be alone when he died. Maybe this was exactly wrong, maybe all he was waiting for was to be left alone. Nevertheless, on my sixth night in town, I stayed up and read a light novel cover to cover while he lay and breathed and loosed his great yawns. A nurse came by, listened to his lungs, and told me he must never have been a smoker. She suggested that I go home to sleep, and she offered to send in a particular nurse from the floor below to visit him. Evidently, the nursing home had a resident angel of death with a special gift for persuading the nearly dead, after their relatives had left for the night, that it was OK for them to die. I declined the nurse’s offer and performed this service myself. I leaned over my father, who smelled faintly of acetic acid but was otherwise clean and warm. Identifying myself, I told him that whatever he needed to do now was fine by me, he should let go and do what he needed to do.

Late that afternoon, a big early-summer St. Louis wind kicked up. I was scrambling eggs when my mother called from the nursing home and told me to hurry over. I don’t know why I thought I had plenty of time, but I ate the eggs with some toast before I left, and in the nursing-home parking lot I sat in the car and turned up the radio, which was playing the Blues Traveler song that was all the rage that season. No song has ever made me happier. The great white oaks all around the nursing home were swaying and turning pale in the big wind. I felt as though I might fly away with happiness.

And still he didn’t die. The storm hit the nursing home in the middle of the evening, knocking out all but the emergency lighting, and my mother and I sat in the dark. I don’t like to remember how impatient I was for my father’s breathing to stop, how ready to be free of him I was. I don’t like to imagine what he was feeling as he lay there, what dim or vivid sensory or emotional forms his struggle took inside his head. But I also don’t like to believe that there was nothing.

Toward ten o’clock, my mother and I were conferring with a nurse in the doorway of his room, not long after the lights came back on, when I noticed that he was drawing his hands up toward his throat. I said, “I think something is happening.” It was agonal breathing: his chin rising to draw air into his lungs after his heart had stopped beating. He seemed to be nodding very slowly and deeply in the affirmative. And then nothing.

After we’d kissed him goodbye and signed the forms that authorized the brain autopsy, after we’d driven through flooding streets, my mother sat down in our kitchen and uncharacteristically accepted my offer of undiluted Jack Daniel’s. “I see now,” she said, “that when you’re dead you’re really dead.” This was true enough. But, in the slow-motion way of Alzheimer’s, my father wasn’t much deader now than he’d been two hours or two weeks or two months ago. We’d simply lost the last of the parts out of which we could fashion a living whole. There would be no new memories of him. The only stories we could tell now were the ones we already had.



[2001]




IMPERIAL BEDROOM


PRIVACY, privacy, the new American obsession: espoused as the most fundamental of rights, marketed as the most desirable of commodities, and pronounced dead twice a week.

Even before Linda Tripp pressed the “Record” button on her answering machine, commentators were warning us that “privacy is under siege,” that “privacy is in a dreadful state,” that “privacy as we now know it may not exist in the year 2000.” They say that both Big Brother and his little brother, John Q. Public, are shadowing me through networks of computers. They tell me that security cameras no bigger than spiders are watching from every shaded corner, that dour feminists are monitoring bedroom behavior and watercooler conversations, that genetic sleuths can decoct my entire being from a droplet of saliva, that voyeurs can retrofit ordinary camcorders with a filter that lets them see through people’s clothing. Then comes the flood of dirty suds from the Office of the Independent Counsel, oozing forth through official and commercial channels to saturate the national consciousness. The Monica Lewinsky scandal marks, in the words of the philosopher Thomas Nagel, “the culmination of a disastrous erosion” of privacy; it represents, in the words of the author Wendy Kaminer, “the utter disregard for privacy and individual autonomy that exists in totalitarian regimes.” In the person of Kenneth Starr, the “public sphere” has finally overwhelmed—shredded, gored, trampled, invaded, run roughshod over—“the private.”

The panic about privacy has all the finger-pointing and paranoia of a good old American scare, but it’s missing one vital ingredient: a genuinely alarmed public. Americans care about privacy mainly in the abstract. Sometimes a well-informed community unites to defend itself, as when Net users bombarded the White House with e-mails against the “clipper chip,” and sometimes an especially outrageous piece of news provokes a national outcry, as when the Lotus Development Corporation tried to market a CD-ROM containing financial profiles of nearly half the people in the country. By and large, though, even in the face of wholesale infringements like the war on drugs, Americans remain curiously passive. I’m no exception. I read the editorials and try to get excited, but I can’t. More often than not, I find myself feeling the opposite of what the privacy mavens want me to. It’s happened twice in the last month alone.

On the Saturday morning when the Times came carrying the complete text of the Starr report, what I felt as I sat alone in my apartment and tried to eat my breakfast was that my own privacy—not Clinton’s, not Lewinsky’s—was being violated. I love the distant pageant of public life. I love both the pageantry and the distance. Now a President was facing impeachment, and as a good citizen I had a duty to stay informed about the evidence, but the evidence here consisted of two people’s groping, sucking, and mutual self-deception. What I felt, when this evidence landed beside my toast and coffee, wasn’t a pretend revulsion to camouflage a secret interest in the dirt; I wasn’t offended by the sex qua sex; I wasn’t worrying about a potential future erosion of my own rights; I didn’t feel the President’s pain in the empathic way he’d once claimed to feel mine; I wasn’t repelled by the revelation that public officials do bad things; and, although I’m a registered Democrat, my disgust was of a different order from my partisan disgust at the news that the Giants have blown a fourth-quarter lead. What I felt I felt personally. I was being intruded on.

A couple of days later, I got a call from one of my credit-card providers, asking me to confirm two recent charges at a gas station and one at a hardware store. Queries like this are common nowadays, but this one was my first, and for a moment I felt eerily exposed. At the same time, I was perversely flattered that someone, somewhere, had taken an interest in me and had bothered to phone. Not that the young male operator seemed to care about me personally. He sounded like he was reading his lines from a laminated booklet. The strain of working hard at a job he almost certainly didn’t enjoy seemed to thicken his tongue. He tried to rush his words out, to speed through them as if in embarrassment or vexation at how nearly worthless they were, but they kept bunching up in his teeth, and he had to stop and extract them with his lips, one by one. It was the computer, he said, the computer that routinely, ah, scans the, you know, the pattern of charges … and was there something else he could help me with tonight? I decided that if this young person wanted to scroll through my charges and ponder the significance of my two fill-ups and my gallon of latex paint, I was fine with it.

So here’s the problem. On the Saturday morning the Starr Report came out, my privacy was, in the classic liberal view, absolute. I was alone in my home and unobserved, unbothered by neighbors, unmentioned in the news, and perfectly free, if I chose, to ignore the report and do the pleasantly al dente Saturday crossword; yet the report’s mere existence so offended my sense of privacy that I could hardly bring myself to touch the thing. Two days later, I was disturbed in my home by a ringing phone, asked to cough up my mother’s maiden name, and made aware that the digitized minutiae of my daily life were being scrutinized by strangers; and within five minutes I’d put the entire episode out of my mind. I felt encroached on when I was ostensibly safe, and I felt safe when I was ostensibly encroached on. And I didn’t know why.

THE RIGHT to privacy—defined by Louis Brandeis and Samuel Warren, in 1890, as “the right to be let alone”—seems at first glance to be an elemental principle in American life. It’s the rallying cry of activists fighting for reproductive rights, against stalkers, for the right to die, against a national health-care database, for stronger data-encryption standards, against paparazzi, for the sanctity of employee e-mail, and against employee drug testing. On closer examination, though, privacy proves to be the Cheshire cat of values: not much substance, but a very winning smile.

Legally, the concept is a mess. Privacy violation is the emotional core of many crimes, from stalking and rape to Peeping Tommery and trespass, but no criminal statute forbids it in the abstract. Civil law varies from state to state but generally follows a forty-year-old analysis by the legal scholar Dean William Prosser, who dissected the invasion of privacy into four torts: intrusion on my solitude, the publishing of private facts about me which are not of legitimate public concern, publicity that puts my character in a false light, and appropriation of my name or likeness without my consent. This is a crumbly set of torts. Intrusion looks a lot like criminal trespass, false light like defamation, and appropriation like theft; and the harm that remains when these extraneous offenses are subtracted is so admirably captured by the phrase “infliction of emotional distress” as to render the tort of privacy invasion all but superfluous. What really undergirds privacy is the classical liberal conception of personal autonomy or liberty. In the last few decades, many judges and scholars have chosen to speak of a “zone of privacy,” rather than a “sphere of liberty,” but this is a shift in emphasis, not in substance: not the making of a new doctrine but the repackaging and remarketing of an old one.

Whatever you’re trying to sell, whether it’s luxury real estate or Esperanto lessons, it helps to have the smiling word “private” on your side. Last winter, as the owner of a Bank One Platinum Visa Card, I was offered enrollment in a program called PrivacyGuard®, which, according to the literature promoting it, “puts you in the know about the very personal records available to your employer, insurers, credit card companies, and government agencies.” The first three months of PrivacyGuard® were free, so I signed up. What came in the mail then was paperwork: envelopes and request forms for a Credit Record Search and other searches, also a disappointingly undeluxe logbook in which to jot down the search results. I realized immediately that I didn’t care enough about, say, my driving records to wait a month to get them; it was only when I called PrivacyGuard® to cancel my membership, and was all but begged not to, that I realized that the whole point of this “service” was to harness my time and energy to the task of reducing Bank One Visa’s fraud losses.

Even issues that legitimately touch on privacy are rarely concerned with the actual emotional harm of unwanted exposure or intrusion. A proposed national Genetic Privacy Act, for example, is premised on the idea that my DNA reveals more about my identity and future health than other medical data do. In fact, DNA is as yet no more intimately revealing than a heart murmur, a family history of diabetes, or an inordinate fondness for Buffalo chicken wings. As with any medical records, the potential for abuse of genetic information by employers and insurers is chilling, but this is only tangentially a privacy issue; the primary harm consists of things like job discrimination and higher insurance premiums.

In a similar way, the problem of online security is mainly about nuts and bolts. What American activists call “electronic privacy” their European counterparts call “data protection.” Our term is exciting; theirs is accurate. If someone is out to steal your Amex number and expiration date, or if an evil ex-boyfriend is looking for your new address, you need the kind of hard-core secrecy that encryption seeks to guarantee. If you’re talking to a friend on the phone, however, you need only a feeling of privacy.

The social drama of data protection goes something like this: a hacker or an insurance company or a telemarketer gains access to a sensitive database, public-interest watchdogs bark loudly, and new firewalls go up. Just as most people are moderately afraid of germs but leave virology to the Centers for Disease Control, most Americans take a reasonable interest in privacy issues but leave the serious custodial work to experts. Our problem now is that the custodians have started speaking a language of panic and treating privacy not as one of many competing values but as the one value that trumps all others.

The novelist Richard Powers recently declared in a Times op-ed piece that privacy is a “vanishing illusion” and that the struggle over the encryption of digital communications is therefore as “great with consequence” as the Cold War. Powers defines “the private” as “that part of life that goes unregistered,” and he sees in the digital footprints we leave whenever we charge things the approach of “that moment when each person’s every living day will become a Bloomsday, recorded in complete detail and reproducible with a few deft keystrokes.” It is scary, of course, to think that the mystery of our identities might be reducible to finite data sequences. That Powers can seriously compare credit-card fraud and intercepted cell-phone calls to thermonuclear incineration, however, speaks mainly to the infectiousness of privacy panic. Where, after all, is it “registered” what Powers or anybody else is thinking, seeing, saying, wishing, planning, dreaming, and feeling ashamed of? A digital Ulysses consisting of nothing but a list of its hero’s purchases and other recordable transactions might run, at most, to four pages: was there really nothing more to Bloom’s day?

When Americans do genuinely sacrifice privacy, moreover, they do so for tangible gains in health or safety or efficiency. Most legalized infringements—HIV notification, airport X-rays, Megan’s Law, Breathalyzer roadblocks, the drug-testing of student athletes, laws protecting fetuses, laws protecting the vegetative, remote monitoring of automobile emissions, county-jail strip searches, even Ken Starr’s exposure of presidential corruption—are essentially public health measures. I resent the security cameras in Washington Square, but I appreciate the ones on a subway platform. The risk that someone is abusing my E-ZPass toll records seems to me comfortably low in comparison with my gain in convenience. Ditto the risk that some gossip rag will make me a victim of the First Amendment; with two hundred and seventy million people in the country, any individual’s chances of being nationally exposed are next to nil.

The legal scholar Lawrence Lessig has characterized Americans as “bovine” for making calculations like this and for thereby acquiescing in what he calls the “Sovietization” of personal life. The curious thing about privacy, though, is that simply by expecting it we can usually achieve it. One of my neighbors in the apartment building across the street spends a lot of time at her mirror examining her pores, and I can see her doing it, just as she can undoubtedly see me sometimes. But our respective privacies remain intact as long as neither of us feels seen. When I send a postcard through the U.S. mail, I’m aware in the abstract that mail handlers may be reading it, may be reading it aloud, may even be laughing at it, but I’m safe from all harm unless, by sheer bad luck, the one handler in the country whom I actually know sees the postcard and slaps his forehead and says, “Oh, jeez, I know this guy.”

OUR PRIVACY panic isn’t merely exaggerated. It’s founded on a fallacy. Ellen Alderman and Caroline Kennedy, in The Right to Privacy, sum up the conventional wisdom of privacy advocates like this: “There is less privacy than there used to be.” The claim has been made or implied so often, in so many books and editorials and talk-show dens, that Americans, no matter how passive they are in their behavior, now dutifully tell pollsters that they’re very much worried about privacy. From almost any historical perspective, however, the claim seems bizarre.

In 1890, an American typically lived in a small town under conditions of near-panoptical surveillance. Not only did his every purchase “register,” but it registered in the eyes and the memory of shopkeepers who knew him, his parents, his wife, and his children. He couldn’t so much as walk to the post office without having his movements tracked and analyzed by neighbors. Probably he grew up sleeping in the same bed with his siblings and possibly with his parents, too. Unless he was well off, his transportation—a train, a horse, his own two feet—either was communal or exposed him to the public eye.

In the suburbs and exurbs where the typical American lives today, tiny nuclear families inhabit enormous houses, in which each person has his or her own bedroom and, sometimes, bathroom. Compared even with suburbs in the sixties and seventies, when I was growing up, the contemporary condominium development or gated community offers a striking degree of anonymity. It’s no longer the rule that you know your neighbors. Communities increasingly tend to be virtual, the participants either faceless or firmly in control of the face they present. Transportation is largely private: the latest SUVs are the size of living rooms and come with onboard telephones, CD players, and TV screens; behind the tinted windows of one of these high-riding I-see-you-but-you-can’t-see-me mobile PrivacyGuard® units, a person can be wearing pajamas or a licorice bikini, for all anybody knows or cares. Maybe the government intrudes on the family a little more than it did a hundred years ago (social workers look in on the old and the poor, health officials require inoculations, the police inquire about spousal battery), but these intrusions don’t begin to make up for the small-town snooping they’ve replaced.

The “right to be let alone”? Far from disappearing, it’s exploding. It’s the essence of modern American architecture, landscape, transportation, communication, and mainstream political philosophy. The real reason that Americans are apathetic about privacy is so big as to be almost invisible: we’re flat-out drowning in privacy.

What’s threatened, then, isn’t the private sphere. It’s the public sphere. Much has been made of the discouraging effect that the Starr investigation may have on future aspirants to public office (only zealots and zeros need apply), but that’s just half of it. The public world of Washington, because it’s public, belongs to everyone. We’re all invited to participate with our votes, our patriotism, our campaigning, and our opinions. The collective weight of a population makes possible our faith in the public world as something larger and more enduring and more dignified than any messy individual can be in private. But, just as one sniper in a church tower can keep the streets of an entire town empty, one real grossout scandal can undermine that faith.

If privacy depends upon an expectation of invisibility, the expectation of visibility is what defines a public space. My “sense of privacy” functions to keep the public out of the private and to keep the private out of the public. A kind of mental Border collie yelps in distress when I feel that the line between the two has been breached. This is why the violation of a public space is so similar, as an experience, to the violation of privacy. I walk past a man taking a leak on a sidewalk in broad daylight (delivery-truck drivers can be especially self-righteous in their “Ya gotta go, ya gotta go” philosophy of bladder management), and although the man with the yawning fly is ostensibly the one whose privacy is compromised by the leak, I’m the one who feels the impingement. Flashers and sexual harassers and fellators on the pier and self-explainers on the crosstown bus all similarly assault our sense of the “public” by exposing themselves.

Since really serious exposure in public today is assumed to be synonymous with being seen on television, it would seem to follow that televised space is the premier public space. Many things that people say to me on television, however, would never be tolerated in a genuine public space—in a jury box, for example, or even on a city sidewalk. TV is an enormous, ramified extension of the billion living rooms and bedrooms in which it’s consumed. You rarely hear a person on the subway talking loudly about, say, incontinence, but on television it’s been happening for years. TV is devoid of shame, and without shame there can be no distinction between public and private. Last winter, an anchorwoman looked me in the eye and, in the tone of a close female relative, referred to a litter of babies in Iowa as “America’s seven little darlin’s.” It was strange enough, twenty-five years ago, to get Dan Rather’s reports on Watergate between spots for Geritol and Bayer aspirin, as if Nixon’s impending resignation were somehow located in my medicine chest. Now, shelved between ads for Promise margarine and Celebrity Cruises, the news itself is a soiled cocktail dress—TV the bedroom floor and nothing but.

Reticence, meanwhile, has become an obsolete virtue. People now readily name their diseases, rents, antidepressants. Sexual histories get spilled on first dates, Birkenstocks and cutoffs infiltrate the office on casual Fridays, telecommuting puts the boardroom in the bedroom, “softer” modern office design puts the bedroom in the boardroom, salespeople unilaterally address customers by their first name, waiters won’t bring me food until I’ve established a personal relationship with them, voice-mail machinery stresses the “I” in “I’m sorry, but I don’t understand what you dialed,” and cyberenthusiasts, in a particularly grotesque misnomer, designate as “public forums” pieces of etched silicon with which a forum’s unshaved “participant” may communicate while sitting cross-legged in tangled sheets. The networked world as a threat to privacy? It’s the ugly spectacle of a privacy triumphant.

A genuine public space is a place where every citizen is welcome to be present and where the purely private is excluded or restricted. One reason that attendance at art museums has soared in recent years is that museums still feel public in this way. After those tangled sheets, how delicious the enforced decorum and the hush, the absence of in-your-face consumerism. How sweet the promenading, the seeing and being seen. Everybody needs a promenade sometimes—a place to go when you want to announce to the world (not the little world of friends and family but the big world, the real world) that you have a new suit, or that you’re in love, or that you suddenly realize you stand a full inch taller when you don’t hunch your shoulders.

Unfortunately, the fully public place is a nearly extinct category. We still have courtrooms and the jury pool, commuter trains and bus stations, here and there a small-town Main Street that really is a main street rather than a strip mall, certain coffee bars, and certain city sidewalks. Otherwise, for American adults, the only halfway public space is the world of work. Here, especially in the upper echelons of business, codes of dress and behavior are routinely enforced, personal disclosures are penalized, and formality is still the rule. But these rituals extend only to the employees of the firm, and even they, when they become old, disabled, obsolete, or outsourceable, are liable to be expelled and thereby relegated to the tangled sheets.

The last big, steep-walled bastion of public life in America is Washington, D.C. Hence the particular violation I felt when the Starr Report crashed in. Hence the feeling of being intruded on. It was privacy invasion, all right: private life brutally invading the most public of public spaces. I don’t want to see sex on the news from Washington. There’s sex everywhere else I look—on sitcoms, on the Web, on dust jackets, in car ads, on the billboards at Times Square. Can’t there be one thing in the national landscape that isn’t about the bedroom? We all know there’s sex in the cloakrooms of power, sex behind the pomp and circumstance, sex beneath the robes of justice; but can’t we act like grownups and pretend otherwise? Pretend not that “no one is looking” but that everyone is looking?

For two decades now, business leaders and politicians across much of the political spectrum, both Gingrich Republicans and Clinton Democrats, have extolled the virtues of privatizing public institutions. But what better word can there be for Lewinskygate and the ensuing irruption of disclosures (the infidelities of Helen Chenoweth, of Dan Burton, of Henry Hyde) than “privatization”? Anyone who wondered what a privatized presidency might look like may now, courtesy of Mr. Starr, behold one.

IN DENIS JOHNSON’S SHORT STORY “Beverly Home,” the young narrator spends his days working at a nursing home for the hopelessly disabled, where there is a particularly unfortunate patient whom no one visits:



A perpetual spasm forced him to perch sideways on his wheelchair and peer down along his nose at his knotted fingers. This condition had descended on him suddenly. He got no visitors. His wife was divorcing him. He was only thirty-three, I believe he said, but it was hard to guess what he told about himself because he really couldn’t talk anymore, beyond clamping his lips repeatedly around his protruding tongue while groaning.

No more pretending for him! He was completely and openly a mess. Meanwhile the rest of us go on trying to fool each other.



In a coast-to-coast, shag-carpeted imperial bedroom, we could all just be messes and save ourselves the trouble of pretending. But who wants to live in a pajama-party world? Privacy loses its value unless there’s something it can be defined against. “Meanwhile the rest of us go on trying to fool each other”—and a good thing, too. The need to put on a public face is as basic as the need for the privacy in which to take it off. We need both a home that’s not like a public space and a public space that’s not like home.

Walking up Third Avenue on a Saturday night, I feel bereft. All around me, attractive young people are hunched over their StarTacs and Nokias with preoccupied expressions, as if probing a sore tooth, or adjusting a hearing aid, or squeezing a pulled muscle; personal technology has begun to look like a personal handicap. All I really want from a sidewalk is that people see me and let themselves be seen, but even this modest ideal is thwarted by cell-phone users and their unwelcome privacy. They say things like “Should we have couscous with that?” and “I’m on my way to Blockbuster.” They aren’t breaking any law by broadcasting these breakfast-nook conversations. There’s no PublicityGuard that I can buy, no expensive preserve of public life to which I can flee. Seclusion, whether in a suite at the Plaza or in a cabin in the Catskills, is comparatively effortless to achieve. Privacy is protected as both commodity and right; public forums are protected as neither. Like old-growth forests, they’re few and irreplaceable and should be held in trust by everyone. The work of maintaining them gets only harder as the private sector grows ever more demanding, distracting, and disheartening. Who has the time and energy to stand up for the public sphere? What rhetoric can possibly compete with the American love of “privacy”?

When I return to my apartment after dark, I don’t immediately turn my lights on. Over the years, it’s become a reflexive precaution on my part not to risk spooking exposed neighbors by flooding my living room with light, although the only activity I ever seem to catch them at is watching TV.

My skin-conscious neighbor is home with her husband tonight, and they seem to be dressing for a party. The woman, a vertical strip of whom is visible between the Levolors and the window frame, is wearing a bathrobe and a barrette and sitting in front of a mirror. The man, slick-haired, wearing suit pants and a white T-shirt, stands by the sofa in the other room and watches television in a posture that I recognize as uncommitted. Finally the woman disappears into the bedroom. The man puts on a white shirt and a necktie and perches sidesaddle on the arm of the sofa, still watching television, more involved with it now. The woman returns wearing a strapless yellow dress and looking like a whole different species of being. Happy the transformation! Happy the distance between private and public! I see a rapid back-and-forth involving jewelry, jackets, and a clutch purse, and then the couple, dressed to the nines, ventures out into the world.



[1998]




WHY BOTHER?


(The Harper’s Essay)

MY DESPAIR about the American novel began in the winter of 1991, when I fled to Yaddo, the artists’ colony in upstate New York, to write the last two chapters of my second book. My wife and I had recently separated, and I was leading a life of self-enforced solitude in New York City, working long days in a small white room, packing up ten years’ worth of communal property, and taking nighttime walks on avenues where Russian, Hindi, Korean, and Spanish were spoken in equal measure. Even deep in my Queens neighborhood, however, news could reach me through my TV set and my Times subscription. The country was preparing for war ecstatically, with rhetoric supplied by George Bush: “Vital issues of principle are at stake.” In Bush’s eighty-nine-percent approval rating, as in the near-total absence of public skepticism about the war, the United States seemed to me hopelessly unmoored from reality—dreaming of glory in the massacre of faceless Iraqis, dreaming of infinite oil for hour-long commutes, dreaming of exemption from the rules of history. And so I, too, was dreaming of escape. I wanted to hide from America. But when I got to Yaddo and realized that it was no haven—the Times came there daily, and my fellow colonists kept talking about Patriot missiles and yellow ribbons—I began to think that what I really needed was a monastery.

Then one afternoon, in Yaddo’s little library, I picked up and read Paula Fox’s short novel Desperate Characters. “She was going to get away with everything!” is the hope that seizes the novel’s main character, Sophie Bentwood, a childless Brooklynite who’s unhappily married to a conservative lawyer. Sophie used to translate French novels; now she’s so depressed that she can hardly even read them. Against the advice of the husband, Otto, she has given milk to a homeless cat, and the cat has repaid the kindness by biting her hand. Sophie immediately feels “vitally wounded”—she’s been bitten for “no reason” just as Josef K. is arrested for “no reason” in The Trial—but when the swelling in her hand subsides she becomes giddy with the hope of being spared rabies shots.

The “everything” Sophie wants to get away with, however, is more than her liberal self-indulgence with the cat. She wants to get away with reading Goncourt novels and eating omelettes aux fines herbes on a street where derelicts lie sprawled in their own vomit and in a country that’s fighting a dirty war in Vietnam. She wants to be spared the pain of confronting a future beyond her life with Otto. She wants to keep dreaming. But the novel’s logic won’t let her. She’s compelled, instead, to this equation of the personal and the social:

“God, if I am rabid, I am equal to what is outside,” she said out loud, and felt an extraordinary relief as though, at last, she’d discovered what it was that could create a balance between the quiet, rather vacant progression of the days she spent in this house, and those portents that lit up the dark at the edge of her own existence.

Desperate Characters, which was first published in 1970, ends with an act of prophetic violence. Breaking under the strain of his collapsing marriage, Otto Bentwood grabs a bottle of ink from Sophie’s escritoire and smashes it against their bedroom wall. The ink in which his law books and Sophie’s translations have been printed now forms an unreadable blot. The black lines on the wall are both a mark of doom and the harbinger of an extraordinary relief, the end to a fevered isolation.

With its equation of a crumbling marriage with a crumbling social order, Desperate Characters spoke directly to the ambiguities that I was experiencing that January. Was it a great thing or a horrible thing that my marriage was coming apart? And did the distress I was feeling derive from some internal sickness of the soul, or was it imposed on me by the sickness of society? That someone besides me had suffered from these ambiguities and had seen light on their far side—that Fox’s book had been published and preserved; that I could find company and consolation and hope in an object pulled almost at random from a bookshelf—felt akin to an instance of religious grace.

Yet even while I was being saved as a reader by Desperate Characters I was succumbing, as a novelist, to despair about the possibility of connecting the personal and the social. The reader who happens on Desperate Characters today will be as struck by the foreignness of the Bentwoods’ world as by its familiarity. A quarter-century has only broadened and confirmed the sense of cultural crisis that Fox was registering. But what now feels like the locus of that crisis—the banal ascendancy of television, the electronic fragmentation of public discourse—is nowhere to be seen in the novel. Communication for the Bentwoods meant books, a telephone, and letters. Portents didn’t stream uninterruptedly through a cable converter or a modem; they were only dimly glimpsed, on the margins of existence. An ink bottle, which now seems impossibly quaint, was still thinkable as a symbol in 1970.

In a winter when every house in the nation was haunted by the ghostly telepresences of Peter Arnett in Baghdad and Tom Brokaw in Saudi Arabia—a winter when the inhabitants of those houses seemed less like individuals than a collective algorithm for the conversion of media jingoism into an eighty-nine-percent approval rating—I was tempted to think that if a contemporary Otto Bentwood were breaking down, he would kick in the screen of his bedroom TV. But this would have missed the point. Otto Bentwood, if he existed in the nineties, would not break down, because the world would no longer even bear on him. As an unashamed elitist, an avatar of the printed word, and a genuinely solitary man, he belongs to a species so endangered as to be all but irrelevant in an age of electronic democracy. For centuries, ink in the form of printed novels has fixed discrete, subjective individuals within significant narratives. What Sophie and Otto were glimpsing, in the vatic black mess on their bedroom wall, was the disintegration of the very notion of a literary character. Small wonder they were desperate. It was still the sixties, and they had no idea what had hit them.



There was a siege going on: it had been going on for a long time, but the besieged themselves were the last to take it seriously.

—from Desperate Characters

WHEN I GOT OUT OF COLLEGE, in 1981, I hadn’t heard the news about the social novel’s death. I didn’t know that Philip Roth had long ago performed the autopsy, describing “American reality” as a thing that “stupefies … sickens … infuriates, and finally … is even a kind of embarrassment to one’s own meager imagination. The actuality is continually outdoing our talents …” I was in love with literature and with a woman to whom I’d been attracted in part because she was a brilliant reader. I had lots of models for the kind of uncompromising novel I wanted to write. I even had a model for an uncompromising novel that had found a big audience: Catch-22. Joseph Heller had figured out a way of outdoing the actuality, employing the illogic of modern warfare as a metaphor for the more general denaturing of American reality. His book had seeped into the national imagination so thoroughly that my Webster’s Ninth Collegiate gave no fewer than five shades of meaning for the title. That no challenging novel since Catch-22 had affected the culture anywhere near as deeply, just as no issue since the Vietnam War had galvanized so many alienated young Americans, was easily overlooked. In college my head had been turned by Marxism, and I believed that “monopoly capitalism” (as we called it) abounded with “negative moments” (as we called them) that a novelist could trick Americans into confronting if only he could package his subversive bombs in a sufficiently seductive narrative.

I began my first book as a twenty-two-year-old dreaming of changing the world. I finished it six years older. The one tiny world-historical hope I still clung to was to appear on KMOX Radio, “the Voice of St. Louis,” whose long, thoughtful author interviews I’d grown up listening to in my mother’s kitchen. My novel, The Twenty-Seventh City, was about the innocence of a Midwestern city—about the poignancy of St. Louis’s municipal ambitions in an age of apathy and distraction—and I looked forward to forty-five minutes with one of KMOX’s afternoon talk-show hosts, whom I imagined teasing out of me the themes that I’d left latent in the book itself. To the angry callers demanding to know why I hated St. Louis I would explain, in the brave voice of someone who had lost his innocence, that what looked to them like hate was in fact tough love. In the listening audience would be my family: my mother, who considered fiction-writing a socially irresponsible career, and my father, who hoped that one day he would pick up Time magazine and find me reviewed in it.

It wasn’t until The Twenty-Seventh City was published, in 1988, that I discovered how innocent I still was. The media’s obsessive interest in my youthfulness surprised me. So did the money. Boosted by the optimism of publishers who imagined that an essentially dark, contrarian entertainment might somehow sell a zillion copies, I made enough to fund the writing of my next book. But the biggest surprise—the true measure of how little I’d heeded my own warning in The Twenty-Seventh City—was the failure of my culturally engaged novel to engage with the culture. I’d intended to provoke; what I got instead was sixty reviews in a vacuum.

My appearance on KMOX was indicative. The announcer was a journeyman with a whiskey sunburn and a heartrending comb-over who clearly hadn’t read past chapter two. Beneath his boom mike he brushed at the novel’s pages as though he hoped to absorb the plot transdermally. He asked me the questions that everybody asked me: How did it feel to get such good reviews? (It felt great, I said.) Was the novel autobiographical? (It was not, I said.) How did it feel to be a local kid returning to St. Louis on a fancy book tour? It felt obscurely disappointing. But I didn’t say this. I’d already realized that the money, the hype, the limo ride to a Vogue shoot weren’t simply fringe benefits. They were the main prize, the consolation for no longer mattering to a culture.

EXACTLY HOW MUCH LESS novels now matter to the American mainstream than they did when Catch-22 was published is impossible to judge. But the ambitious young fiction writer can’t help noting that, in a recent USA Today survey of twenty-four hours in the life of American culture, there were twenty-one references to television, eight to film, seven to popular music, four to radio, and one to fiction (The Bridges of Madison County). Or that magazines like The Saturday Review, which in Joseph Heller’s heyday still vetted novels by the bushel, have entirely disappeared. Or that the Times Book Review nowadays runs as few as two full fiction reviews a week (fifty years ago, the fiction-to-nonfiction ratio was one to one).

The only mainstream American household I know well is the one I grew up in, and I can report that my father, who was not a reader, nevertheless had some acquaintance with James Baldwin and John Cheever, because Time magazine put them on its cover and Time, for my father, was the ultimate cultural authority. In the last decade, the magazine whose red border twice enclosed the face of James Joyce has devoted covers to Scott Turow and Stephen King. These are honorable writers; but no one doubts it was the size of their contracts that won them covers. The dollar is now the yardstick of cultural authority, and an organ like Time, which not long ago aspired to shape the national taste, now serves mainly to reflect it.

The literary America in which I found myself after I published The Twenty-Seventh City bore a strange resemblance to the St. Louis I’d grown up in: a once-great city that had been gutted and drained by white flight and superhighways. Ringing the depressed urban core of serious fiction were prosperous new suburbs of mass entertainments. Much of the inner city’s remaining vitality was concentrated in the black, Hispanic, Asian, gay, and women’s communities that had taken over the structures vacated by fleeing straight white males. MFA programs offered housing and workfare to the underemployed; a few crackpot city-loving artists continued to hole up in old warehouses; and visiting readers could still pay weekend visits to certain well-policed cultural monuments—the temple of Toni Morrison, the orchestra of John Updike, the Faulkner House, the Wharton Museum, and Mark Twain Park.

By the early nineties I was as depressed as the inner city of fiction. My second novel, Strong Motion, was a long, complicated story about a Midwestern family in a world of moral upheaval, and this time, instead of sending my bombs in a Jiffy-Pak mailer of irony and understatement, as I had with The Twenty-Seventh City, I’d come out throwing rhetorical Molotov cocktails. But the result was the same: another report card with A’s and B’s from the reviewers who had replaced the teachers whose approval, when I was younger, I had both craved and taken no satisfaction from; decent money; and the silence of irrelevance. Meanwhile, my wife and I had reunited in Philadelphia. For two years we’d bounced around in three time zones, trying to find a pleasant, inexpensive place in which we didn’t feel like strangers. Finally, after exhaustive deliberation, we’d rented a too-expensive house in yet another depressed city. That we then proceeded to be miserable seemed to confirm beyond all doubt that there was no place in the world for fiction writers.

In Philadelphia I began to make unhelpful calculations, multiplying the number of books I’d read in the previous year by the number of years I might reasonably be expected to live, and perceiving in the three-digit product not so much an intimation of mortality (though the news on that front wasn’t cheering) as a measure of the incompatibility of the slow work of reading and the hyperkinesis of modern life. All of a sudden it seemed as if the friends of mine who used to read no longer even apologized for having stopped. A young acquaintance who had been an English major, when I asked her what she was reading, replied: “You mean linear reading? Like when you read a book from start to finish?”

There’s never been much love lost between literature and the marketplace. The consumer economy loves a product that sells at a premium, wears out quickly or is susceptible to regular improvement, and offers with each improvement some marginal gain in usefulness. To an economy like this, news that stays news is not merely an inferior product; it’s an antithetical product. A classic work of literature is inexpensive, infinitely reusable, and, worst of all, unimprovable.

After the collapse of the Soviet Union, the American political economy had set about consolidating its gains, enlarging its markets, securing its profits, and demoralizing its few remaining critics. In 1993 I saw signs of the consolidation everywhere. I saw it in the swollen minivans and broadbeamed trucks that had replaced the automobile as the suburban vehicle of choice—these Rangers and Land Cruisers and Voyagers that were the true spoils of a war waged to keep American gasoline cheaper than dirt, a war that had played like a thousand-hour infomercial for high technology, a consumer’s war dispensed through commercial television. I saw leaf-blowers replacing rakes. I saw CNN holding hostage the travelers in airport lounges and the shoppers in supermarket checkout lines. I saw the 486 chip replacing the 386 and being replaced in turn by the Pentium so that, despite new economies of scale, the price of entry-level notebook computers never fell below a thousand dollars. I saw Penn State win the Blockbuster Bowl.

Even as I was sanctifying the reading of literature, however, I was becoming so depressed that I could do little after dinner but flop in front of the TV. We didn’t have cable, but I could always find something delicious: Phillies and Padres, Eagles and Bengals, M*A*S*H, Cheers, Homicide. Naturally, the more TV I watched, the worse I felt. If you’re a novelist and even you don’t feel like reading, how can you expect anybody else to read your books? I believed I ought to be reading, as I believed I ought to be writing a third novel. And not just any third novel. It had long been a prejudice of mine that putting a novel’s characters in a dynamic social setting enriched the story that was being told; that the glory of the genre consisted of its spanning of the expanse between private experience and public context. And what more vital context could there be than television’s short-circuiting of that expanse?

But I was paralyzed with the third book. I was torturing the story, stretching it to accommodate ever more of those things-in-the-world that impinge on the enterprise of fiction writing. The work of transparency and beauty and obliqueness that I wanted to write was getting bloated with issues. I’d already worked in contemporary pharmacology and TV and race and prison life and a dozen other vocabularies; how was I going to satirize Internet boosterism and the Dow Jones as well, while leaving room for the complexities of character and locale? Panic grows in the gap between the increasing length of the project and the shrinking time increments of cultural change: How to design a craft that can float on history for as long as it takes to build it? The novelist has more and more to say to readers who have less and less time to read: Where to find the energy to engage with a culture in crisis when the crisis consists in the impossibility of engaging with the culture? These were unhappy days. I began to think that there was something wrong with the whole model of the novel as a form of “cultural engagement.”

IN THE NINETEENTH CENTURY, when Dickens and Darwin and Disraeli all read one another’s work, the novel was the preeminent medium of social instruction. A new book by Thackeray or William Dean Howells was anticipated with the kind of fever that a late-December film release inspires today.

The big, obvious reason for the decline of the social novel is that modern technologies do a much better job of social instruction. Television, radio, and photographs are vivid, instantaneous media. Print journalism, too, in the wake of In Cold Blood, has become a viable creative alternative to the novel. Because they command large audiences, TV and magazines can afford to gather vast quantities of information quickly. Few serious novelists can pay for a quick trip to Singapore, or for the mass of expert consulting that gives serial TV dramas like E.R. and NYPD Blue their veneer of authenticity. The writer of average talent who wants to report on, say, the plight of illegal aliens would be foolish to choose the novel as a vehicle. Ditto the writer who wants to offend prevailing sensibilities. Portnoy’s Complaint, which even my mother once heard enough about to disapprove of, was probably the last American novel that could have appeared on Bob Dole’s radar as a nightmare of depravity. Today’s Baudelaires are hip-hop artists.

The essence of fiction is solitary work: the work of writing, the work of reading. I’m able to know Sophie Bentwood intimately, and to refer to her as casually as I would to a good friend, because I poured my own feelings of fear and estrangement into my construction of her. If I knew her only through a video of Desperate Characters (Shirley MacLaine made the movie in 1971, as a vehicle for herself), Sophie would remain an Other, divided from me by the screen on which I viewed her, by the surficiality of film, and by MacLaine’s star presence. At most, I might feel I knew MacLaine a little better.

Knowing MacLaine a little better, however, is what the country mainly wants. We live in a tyranny of the literal. The daily unfolding stories of O. J. Simpson, Timothy McVeigh, and Bill Clinton have an intense, iconic presence that relegates to a subordinate shadow-world our own untelevised lives. In order to justify their claim on our attention, the organs of mass culture and information are compelled to offer something “new” on a daily, indeed hourly, basis. Although good novelists don’t deliberately seek out trends, many of them feel a responsibility to pay attention to contemporary issues, and they now confront a culture in which almost all the issues are burned out almost all the time. The writer who wants to tell a story about society that’s true not just in 1996 but in 1997 as well can find herself at a loss for solid cultural referents. What’s topically relevant while she’s planning the novel will almost certainly be passé by the time it’s written, rewritten, published, distributed, and read.

None of this stops cultural commentators—notably Tom Wolfe—from blaming novelists for their retreat from social description. The most striking thing about Wolfe’s 1989 manifesto for the “New Social Novel,” even more than his uncanny ignorance of the many excellent socially engaged novels published between 1960 and 1989, was his failure to explain why his ideal New Social Novelist should not be writing scripts for Hollywood. And so it’s worth saying one more time: Just as the camera drove a stake through the heart of serious portraiture, television has killed the novel of social reportage. Truly committed social novelists may still find cracks in the monolith to sink their pitons into. But they do so with the understanding that they can no longer depend on their material, as Howells and Sinclair and Stowe did, but only on their own sensibilities, and with the expectation that no one will be reading them for news.

THIS MUCH, at least, was visible to Philip Roth in 1961. Noting that “for a writer of fiction to feel that he does not really live in his own country—as represented by Life or by what he experiences when he steps out the front door—must seem a serious occupational impediment,” he rather plaintively asked: “What will his subject be? His landscape?” In the intervening years, however, the screw has taken another turn. Our obsolescence now goes further than television’s usurpation of the role as news-bringer, and deeper than its displacement of the imagined with the literal. Flannery O’Connor, writing around the time that Roth made his remarks, insisted that the “business of fiction” is “to embody mystery through manners.” Like the poetics that Poe derived from his “Raven,” O’Connor’s formulation particularly flatters her own work, but there’s little question that “mystery” (how human beings avoid or confront the meaning of existence) and “manners” (the nuts and bolts of how human beings behave) have always been primary concerns of fiction writers. What’s frightening for a novelist today is how the technological consumerism that rules our world specifically aims to render both of these concerns moot.

O’Connor’s response to the problem that Roth articulated, to the sense that there is little in the national mediascape that novelists can feel they own, was to insist that the best American fiction has always been regional. This was somewhat awkward, since her hero was the cosmopolitan Henry James. But what she meant was that fiction feeds on specificity, and that the manners of a particular region have always provided especially fertile ground for its practitioners.

Superficially, at least, regionalism is still thriving. In fact it’s fashionable on college campuses nowadays to say that there is no America anymore, there are only Americas; that the only things a black lesbian New Yorker and a Southern Baptist Georgian have in common are the English language and the federal income tax. The likelihood, however, is that both the New Yorker and the Georgian watch Letterman every night, both are struggling to find health insurance, both have jobs that are threatened by the migration of employment overseas, both go to discount superstores to purchase Pocahontas tie-in products for their children, both are being pummeled into cynicism by commercial advertising, both play Lotto, both dream of fifteen minutes of fame, both are taking a serotonin reuptake inhibitor, and both have a guilty crush on Uma Thurman. The world of the present is a world in which the rich lateral dramas of local manners have been replaced by a single vertical drama, the drama of regional specificity succumbing to a commercial generality. The American writer today faces a cultural totalitarianism analogous to the political totalitarianism with which two generations of Eastern bloc writers had to contend. To ignore it is to court nostalgia. To engage with it, however, is to risk writing fiction that makes the same point over and over: technological consumerism is an infernal machine, technological consumerism is an infernal machine …

Equally discouraging is the fate of “manners” in the word’s more common sense. Rudeness, irresponsibility, duplicity, and stupidity are hallmarks of real human interaction: the stuff of conversation, the cause of sleepless nights. But in the world of consumer advertising and consumer purchasing, no evil is moral. The evils consist of high prices, inconvenience, lack of choice, lack of privacy, heartburn, hair loss, slippery roads. This is no surprise, since the only problems worth advertising solutions for are problems treatable through the spending of money. But money cannot solve the problem of bad manners—the chatterer in the darkened movie theater, the patronizing sister-in-law, the selfish sex partner—except by offering refuge in an atomized privacy. And such privacy is exactly what the American Century has tended toward. First there was mass suburbanization, then the perfection of at-home entertainment, and finally the creation of virtual communities whose most striking feature is that interaction within them is entirely optional—terminable the instant the experience ceases to gratify the user.

That all these trends are infantilizing has been widely noted. Less often remarked is the way in which they are changing both our expectations of entertainment (the book must bring something to us, rather than our bringing something to the book) and the very content of that entertainment. The problem for the novelist is not just that the average man or woman spends so little time F2F with his or her fellows; there is, after all, a rich tradition of epistolary novels, and Robinson Crusoe’s condition approximates the solitude of today’s suburban bachelor. The real problem is that the average man or woman’s entire life is increasingly structured to avoid the kinds of conflicts on which fiction, preoccupied with manners, has always thrived.

Here, indeed, we are up against what truly seems like the obsolescence of serious art in general. Imagine that human existence is defined by an Ache: the Ache of our not being, each of us, the center of the universe; of our desires forever outnumbering our means of satisfying them. If we see religion and art as the historically preferred methods of coming to terms with this Ache, then what happens to art when our technological and economic systems and even our commercialized religions become sufficiently sophisticated to make each of us the center of our own universe of choices and gratifications? Fiction’s response to the sting of poor manners, for example, is to render them comic. The reader laughs with the writer, feels less alone with the sting. This is a delicate transaction, and it takes some work. How can it compete with a system—screen your calls; go out by modem; acquire the money to deal exclusively with the privatized world, where workers must be courteous or lose their jobs—that spares you the sting in the first place?

In the long run, the breakdown of communitarianism is likely to have all sorts of nasty consequences. In the short run, however, in this century of amazing prosperity and health, the breakdown takes a heavy toll on the ancient methods of dealing with the Ache. As for the sense of loneliness and pointlessness and loss that social atomization may produce—stuff that can be lumped under O’Connor’s general heading of mystery—it’s already enough to label it a disease. A disease has causes: abnormal brain chemistry, childhood sexual abuse, welfare queens, the patriarchy, social dysfunction. It also has cures: Zoloft, recovered-memory therapy, the Contract with America, multiculturalism, the World Wide Web. A partial cure, or better yet, an endless succession of partial cures, but failing that, even just the consolation of knowing you have a disease—anything is better than mystery. Science attacked religious mystery a long time ago. But it was not until applied science, in the form of technology, changed both the demand for fiction and the social context in which fiction is written that we novelists fully felt its effects.

EVEN NOW, even when I carefully locate my despair in the past tense, it’s difficult for me to confess to all these doubts. In publishing circles, confessions of doubt are widely referred to as “whining”—the idea being that cultural complaint is pathetic and self-serving in writers who don’t sell, ungracious in writers who do. For people as protective of their privacy and as fiercely competitive as writers are, mute suffering would seem to be the safest course. However sick with foreboding you feel inside, it’s best to radiate confidence and to hope that it’s infectious. When a writer says publicly that the novel is doomed, it’s a sure bet his new book isn’t going well; in terms of his reputation, it’s like bleeding in shark-infested waters.

Even harder to admit is how depressed I was. As the social stigma of depression dwindles, the aesthetic stigma increases. It’s not just that depression has become fashionable to the point of banality. It’s the sense that we live in a reductively binary culture: you’re either healthy or you’re sick, you either function or you don’t. And if that flattening of the field of possibilities is precisely what’s depressing you, you’re inclined to resist participating in the flattening by calling yourself depressed. You decide that it’s the world that’s sick, and that the resistance of refusing to function in such a world is healthy. You embrace what clinicians call “depressive realism.” It’s what the chorus in Oedipus Rex sings: “Alas, ye generations of men, how mere a shadow do I count your life! Where, where is the mortal who wins more of happiness than just the seeming, and, after the semblance, a falling away?” You are, after all, just protoplasm, and some day you’ll be dead. The invitation to leave your depression behind, whether through medication or therapy or effort of will, seems like an invitation to turn your back on all your dark insights into the corruption and infantilism and self-delusion of the brave new McWorld. And these insights are the sole legacy of the social novelist who desires to represent the world not simply in its detail but in its essence, to shine light on the morally blind eye of the virtual whirlwind, and who believes that human beings deserve better than the future of attractively priced electronic panderings that is even now being conspired for them. Instead of saying I am depressed, you want to say I am right!

But all the available evidence suggests that you have become a person who’s impossible to live with and no fun to talk to. And as you increasingly feel, as a novelist, that you are one of the last remaining repositories of depressive realism and of the radical critique of the therapeutic society that it represents, the burden of news-bringing that is placed on your art becomes overwhelming. You ask yourself, why am I bothering to write these books? I can’t pretend the mainstream will listen to the news I have to bring. I can’t pretend I’m subverting anything, because any reader capable of decoding my subversive messages does not need to hear them (and the contemporary art scene is a constant reminder of how silly things get when artists start preaching to the choir). I can’t stomach any kind of notion that serious fiction is good for us, because I don’t believe that everything that’s wrong with the world has a cure, and even if I did, what business would I, who feel like the sick one, have in offering it? It’s hard to consider literature a medicine, in any case, when reading it serves mainly to deepen your depressing estrangement from the mainstream; sooner or later the therapeutically minded reader will end up fingering reading itself as the sickness. Sophie Bentwood, for instance, has “candidate for Prozac” written all over her. No matter how gorgeous and comic her torments are, and no matter how profoundly human she appears in light of those torments, a reader who loves her can’t help wondering whether perhaps treatment by a mental-health-care provider wouldn’t be the best course all around.

I resist, finally, the notion of literature as a noble higher calling, because elitism doesn’t sit well with my American nature, and because even if my belief in mystery didn’t incline me to distrust feelings of superiority, my belief in manners would make it difficult for me to explain to my brother, who is a fan of Michael Crichton, that the work I’m doing is simply better than Crichton’s. Not even the French poststructuralists, with their philosophically unassailable celebration of the “pleasure of the text,” can help me out here, because I know that no matter how metaphorically rich and linguistically sophisticated Desperate Characters is, what I experienced when I first read it was not some erotically joyous lateral slide of endless associations, but something coherent and deadly pertinent. I know there’s a reason I loved reading and loved writing. But every apology and every defense seems to dissolve in the sugar water of contemporary culture, and before long it becomes difficult indeed to get out of bed in the morning.

TWO QUICK GENERALIZATIONS about novelists: we don’t like to poke too deeply into the question of audience, and we don’t like the social sciences. How awkward, then, that for me the beacon in the murk—the person who inadvertently did the most to get me back on track as a writer—should have been a social scientist who was studying the audience for serious fiction in America.

Shirley Brice Heath is a MacArthur Fellow, a linguistic anthropologist, and a professor of English and linguistics at Stanford; she’s a stylish, twiggy, white-haired lady with no discernible tolerance for small talk. Throughout the eighties, Heath haunted what she calls “enforced transition zones”—places where people are held captive without recourse to television or other comforting pursuits. She rode public transportation in twenty-seven different cities. She lurked in airports (at least before the arrival of CNN). She took her notebook into bookstores and seaside resorts. Whenever she saw people reading or buying “substantive works of fiction” (meaning, roughly, trade-paperback fiction), she asked for a few minutes of their time. She visited summer writers’ conferences and creative-writing programs to grill ephebes. She interviewed novelists. Three years ago she interviewed me, and last summer I had lunch with her in Palo Alto.



